Departure from normality of increasing-dimension martingales
Authors
Abstract
Similar articles
Impact of Departure from Normality on the E...
This article considers a linear regression model in which some observations on an explanatory variable are missing, and presents three least squares estimators for the regression coefficients vector. One estimator uses complete observations alone, while the other two utilize repaired data with nonstochastic and stochastic imputed values for the missing observations. Asymptotic properties...
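The abstract does not reproduce the estimators' formulas; as a rough illustration only, here is a minimal Python sketch of the three kinds of least squares estimators it compares (complete-case, nonstochastic mean imputation, stochastic imputation). The simulated data and all names are assumptions, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical simulated data: y = b0 + b1*x + error, with some x missing.
    n = 200
    x = rng.normal(size=n)
    y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
    missing = rng.random(n) < 0.3                 # 30% of x unobserved
    x_obs = np.where(missing, np.nan, x)

    def ols(xv, yv):
        # Least squares fit of y on [1, x]; returns (intercept, slope).
        X = np.column_stack([np.ones_like(xv), xv])
        return np.linalg.lstsq(X, yv, rcond=None)[0]

    # 1) Complete-case estimator: drop rows where x is missing.
    keep = ~np.isnan(x_obs)
    beta_cc = ols(x_obs[keep], y[keep])

    # 2) Nonstochastic imputation: replace missing x by the observed mean.
    x_bar = np.nanmean(x_obs)
    beta_det = ols(np.where(np.isnan(x_obs), x_bar, x_obs), y)

    # 3) Stochastic imputation: mean plus a random draw with the
    #    observed spread of x.
    draw = x_bar + rng.normal(scale=np.nanstd(x_obs), size=n)
    beta_sto = ols(np.where(np.isnan(x_obs), draw, x_obs), y)

    print(beta_cc, beta_det, beta_sto)            # all near (1.0, 2.0)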
Departure from Normality and Eigenvalue Perturbation Bounds
Perturbation bounds for eigenvalues of diagonalizable matrices are derived that do not depend on any quantities associated with the perturbed matrix; in particular, the perturbed matrix can be defective. Furthermore, Gerschgorin-like inclusion regions in the Frobenius norm are derived, as well as bounds on the departure from normality. 1. Introduction. The results in this paper are based on two eigen...
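The Frobenius-norm departure from normality referred to here is usually taken to be Henrici's measure; a minimal sketch computing it, assuming that definition (the function name is illustrative):

    import numpy as np

    def departure_from_normality(A):
        # Henrici's measure in the Frobenius norm (assumed definition):
        # dep_F(A) = sqrt(||A||_F^2 - sum_i |lambda_i|^2),
        # which is zero exactly when A is normal (A A* = A* A).
        eigvals = np.linalg.eigvals(A)
        gap = np.linalg.norm(A, 'fro')**2 - np.sum(np.abs(eigvals)**2)
        return np.sqrt(max(gap, 0.0))             # clip tiny rounding error

    A_normal = np.array([[0., -1.], [1., 0.]])    # skew-symmetric, hence normal
    A_jordan = np.array([[1., 1.], [0., 1.]])     # defective Jordan block
    print(departure_from_normality(A_normal))     # ~0.0
    print(departure_from_normality(A_jordan))     # 1.0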
Asymptotic normality of quadratic forms with random vectors of increasing dimension
This paper provides sufficient conditions for the asymptotic normality of quadratic forms of averages of random vectors of increasing dimension and improves on conditions found in the literature. Such results are needed in applications of Owen’s empirical likelihood when the number of constraints is allowed to grow with the sample size. In this connection we fix a gap in the proof of Theorem 4....
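As a toy illustration of the phenomenon the abstract describes, the following sketch standardizes the quadratic form Q = n·x̄′x̄ for i.i.d. non-Gaussian data and checks that it looks standard normal as p grows with n; the specific design and growth rates are assumptions, not the paper's conditions.

    import numpy as np

    rng = np.random.default_rng(1)

    def standardized_quadratic_form(n, p, reps=500):
        # Q = n * xbar' xbar for i.i.d. mean-zero, unit-variance entries is
        # approximately chi-squared with p degrees of freedom, so
        # (Q - p) / sqrt(2p) should look N(0, 1) as p grows.
        z = np.empty(reps)
        for r in range(reps):
            X = rng.exponential(size=(n, p)) - 1.0   # non-Gaussian, variance 1
            xbar = X.mean(axis=0)
            Q = n * xbar @ xbar
            z[r] = (Q - p) / np.sqrt(2 * p)
        return z

    for n, p in [(100, 5), (400, 20), (1600, 80)]:
        z = standardized_quadratic_form(n, p)
        print(n, p, round(z.mean(), 2), round(z.std(), 2))  # near 0 and 1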
Consequences of Departure from Normality on the Properties of Calibration Estimators
This paper considers the classical and inverse calibration estimators and discusses the consequences of departure from normality of errors on their bias and mean squared error properties when the errors in the calibration process are small.
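A minimal sketch contrasting the two estimators named in the abstract, assuming the usual simple-linear-calibration setup with small errors (the data and variable names are illustrative):

    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical calibration experiment: y = a + b*x + small error.
    a, b = 0.5, 2.0
    x = np.linspace(0.0, 10.0, 30)
    y = a + b * x + rng.normal(scale=0.1, size=x.size)

    # Classical estimator: regress y on x, then invert the fitted line.
    b_hat, a_hat = np.polyfit(x, y, 1)
    # Inverse estimator: regress x on y directly.
    d_hat, c_hat = np.polyfit(y, x, 1)

    y0 = a + b * 4.2                       # new response; true x0 = 4.2
    x_classical = (y0 - a_hat) / b_hat
    x_inverse = c_hat + d_hat * y0
    print(x_classical, x_inverse)          # both close to 4.2 for small errors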
Posterior Normality and Reference Priors for Exponential Families with Increasing Dimension
In this article, we study asymptotic normality of the posterior distribution of the natural parameter in an exponential family based on independent and identically distributed (i.i.d.) data, that is, in terms of expected Kullback-Leibler divergence, when the number of parameters p is increasing with the sample size n. We use this to generate an asymptotic expansion of the Shannon mutual information...
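A one-parameter toy sketch of posterior normality in an exponential family (Bernoulli with natural parameter theta = logit p and a flat prior), comparing the exact posterior with its normal approximation; note the paper's setting lets the dimension p grow with n, which this illustration does not attempt.

    import numpy as np

    rng = np.random.default_rng(3)

    # i.i.d. Bernoulli data; natural parameter theta = logit(p).
    n = 500
    s = int(np.sum(rng.random(n) < 0.3))             # number of successes

    theta = np.linspace(-3.0, 1.0, 2000)
    # Log posterior under a flat prior on theta.
    logpost = s * theta - n * np.log1p(np.exp(theta))
    post = np.exp(logpost - logpost.max())
    post /= np.trapz(post, theta)                    # normalize on the grid

    # Asymptotic-normality approximation: N(theta_hat, 1 / (n * I(theta_hat))).
    p_hat = s / n
    theta_hat = np.log(p_hat / (1 - p_hat))
    var = 1.0 / (n * p_hat * (1 - p_hat))            # inverse Fisher information
    normal = np.exp(-(theta - theta_hat)**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

    print(np.trapz(np.abs(post - normal), theta))    # small L1 gap for large n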
Journal
Journal title: Journal of Multivariate Analysis
Year: 2009
ISSN: 0047-259X
DOI: 10.1016/j.jmva.2008.11.004